Riemannian Multigrid Line Search for Low-Rank Problems

Authors

Abstract

Article Data
Submitted: 13 May 2020
Accepted: 02 March 2021
Published online: 20 2021
Keywords: low-rank matrices, optimization on manifolds, multilevel optimization, Riemannian retraction-based line search, roundoff error
AMS Subject Headings: 65F10, 65N22, 65F50, 65K10

Publication Data
ISSN (print): 1064-8275
ISSN (online): 1095-7197
Publisher: Society for Industrial and Applied Mathematics
CODEN: sjoce3

Related Articles

A Riemannian rank-adaptive method for low-rank optimization

This paper presents an algorithm that solves optimization problems on a matrix manifold M ⊆ R^{m×n} with an additional rank inequality constraint. The algorithm resorts to well-known Riemannian optimization schemes on fixed-rank manifolds, combined with new mechanisms to increase or decrease the rank. The convergence of the algorithm is analyzed and a weighted low-rank approximation problem is use...

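A minimal sketch of the rank increase/decrease idea, under simplifying assumptions: a truncated SVD stands in for the fixed-rank Riemannian solver, and the names and tolerances (fixed_rank_solver, rank_adaptive, tol, drop_tol) are hypothetical, not taken from the cited paper.

import numpy as np

def fixed_rank_solver(A, k):
    # Stand-in for a Riemannian fixed-rank solver: best rank-k approximation via SVD.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :], s[:k]

def rank_adaptive(A, k0=1, k_max=20, tol=1e-8, drop_tol=1e-12):
    k = k0
    while True:
        X, s = fixed_rank_solver(A, k)
        res = np.linalg.norm(A - X) / np.linalg.norm(A)
        if res <= tol or k >= k_max:
            # Rank-decrease mechanism: discard negligible singular values.
            k = int(np.sum(s > drop_tol * s[0]))
            return fixed_rank_solver(A, k)[0], k
        k += 1  # rank-increase mechanism: the current rank cannot fit the data well enough

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))  # exact rank 5
X, k = rank_adaptive(A)
print(k, np.linalg.norm(A - X) / np.linalg.norm(A))  # detected rank and relative error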

Low-Rank Matrix Completion by Riemannian Optimization

The matrix completion problem consists of finding or approximating a low-rank matrix based on a few samples of this matrix. We propose a novel algorithm for matrix completion that minimizes the least square distance on the sampling set over the Riemannian manifold of fixed-rank matrices. The algorithm is an adaptation of classical non-linear conjugate gradients, developed within the framework o...

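The setup can be written down concretely. The sketch below is only a simplified stand-in under stated assumptions: it minimizes the same least-squares cost on the sampling set, but with a plain gradient step and an SVD-based retraction instead of the Riemannian conjugate gradient scheme the abstract refers to; all names, sizes, and the step size are hypothetical.

import numpy as np

rng = np.random.default_rng(1)
n, r = 60, 3
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r ground truth
mask = rng.random((n, n)) < 0.35                               # sampling set Omega

def cost(X):
    # f(X) = 1/2 * || P_Omega(X - A) ||_F^2, the least-squares distance on the samples
    return 0.5 * np.linalg.norm((X - A)[mask]) ** 2

def retract(Y, r):
    # Map back onto the set of rank-r matrices via truncated SVD
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

X = retract(np.where(mask, A, 0.0), r)       # initial guess from the observed entries
for _ in range(500):
    G = np.where(mask, X - A, 0.0)           # Euclidean gradient of f
    X = retract(X - G, r)                    # gradient step followed by retraction
print(cost(X))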

Low-Rank Tensor Completion by Riemannian Optimization

In tensor completion, the goal is to fill in missing entries of a partially known tensor under a low-rank constraint. We propose a new algorithm that performs Riemannian optimization techniques on the manifold of tensors of fixed multilinear rank. More specifically, a variant of the nonlinear conjugate gradient method is developed. Paying particular attention to the efficient implementation, ou...

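What "fixed multilinear rank" means can be made concrete with the Tucker format. The snippet below only constructs such a tensor and checks its multilinear rank via mode unfoldings; the sizes and names are arbitrary, and it does not implement the completion algorithm of the cited paper.

import numpy as np

rng = np.random.default_rng(2)
n1, n2, n3 = 10, 12, 14
r1, r2, r3 = 2, 3, 4                     # target multilinear rank (r1, r2, r3)

# Tucker format: X = C x_1 U1 x_2 U2 x_3 U3, the parametrization underlying the
# manifold of tensors of fixed multilinear rank.
C = rng.standard_normal((r1, r2, r3))    # core tensor
U1 = np.linalg.qr(rng.standard_normal((n1, r1)))[0]
U2 = np.linalg.qr(rng.standard_normal((n2, r2)))[0]
U3 = np.linalg.qr(rng.standard_normal((n3, r3)))[0]
X = np.einsum('abc,ia,jb,kc->ijk', C, U1, U2, U3)

# The rank of each mode-k unfolding recovers the multilinear rank of X.
ranks = [np.linalg.matrix_rank(np.moveaxis(X, k, 0).reshape(X.shape[k], -1))
         for k in range(3)]
print(ranks)  # expected: [2, 3, 4]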

Fixed-rank matrix factorizations and Riemannian low-rank optimization

Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and...

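The quotient-manifold viewpoint comes from the non-uniqueness of fixed-rank factorizations, which a few lines of NumPy make explicit. This is only an illustrative check, not material from the cited paper; the two-factor form X = G H^T used here is just one of the factorizations the abstract alludes to.

import numpy as np

rng = np.random.default_rng(3)
m, n, r = 8, 6, 2
G = rng.standard_normal((m, r))
H = rng.standard_normal((n, r))
M = rng.standard_normal((r, r)) + 3 * np.eye(r)   # (almost surely) invertible r x r matrix

X1 = G @ H.T
X2 = (G @ M) @ (H @ np.linalg.inv(M).T).T         # transformed factors, identical product

# (G, H) and (G M, H M^{-T}) represent the same rank-r matrix, so factor pairs are only
# determined up to this group action; quotienting it out gives the actual search space.
print(np.allclose(X1, X2))  # True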

Guarantees of Riemannian Optimization for Low Rank Matrix Completion

We study Riemannian optimization methods on the embedded manifold of low-rank matrices for the problem of matrix completion, which is about recovering a low-rank matrix from a subset of its entries. Assume m entries of an n × n rank-r matrix are sampled independently and uniformly with replacement. We first prove that with high probability the Riemannian gradient descent and conjugate gradient d...

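The sampling model stated in this abstract (m entries of an n × n rank-r matrix drawn uniformly with replacement) is easy to set up explicitly. The sizes below are arbitrary, and the snippet only generates the observations, not the recovery or its guarantee.

import numpy as np

rng = np.random.default_rng(4)
n, r, m = 100, 5, 4000                 # arbitrary illustrative sizes
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r ground truth

rows = rng.integers(0, n, size=m)      # uniform sampling with replacement,
cols = rng.integers(0, n, size=m)      # so the same entry can be drawn more than once
values = A[rows, cols]                 # the m observed (possibly repeated) entries

distinct = len(set(zip(rows.tolist(), cols.tolist())))
print(m, distinct)                     # number of draws vs. distinct observed positions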

Journal

Journal title: SIAM Journal on Scientific Computing

Year: 2021

ISSN: 1095-7197, 1064-8275

DOI: https://doi.org/10.1137/20m1337430